
chore: Update Image name retrieved field to fix the params-validator #1573

Merged: 1 commit merged into opendatahub-io:main on Aug 1, 2025

Conversation

@atheo89 (Member) commented Jul 31, 2025

Description

The check-params validator fails because it wasn't updated to reflect the latest changes (onboarding RStudio to Konflux as well): https://github.com/opendatahub-io/notebooks/actions/runs/16630011060/job/47056679708?pr=1570#logs

How Has This Been Tested?

The GHA run is now green: https://github.com/atheo89/notebooks/actions/runs/16645240481/job/47104233706

Merge criteria:

  • The commits are squashed in a cohesive manner and have meaningful messages.
  • Testing instructions have been added in the PR body (for PRs involving changes that are not immediately obvious).
  • The developer has manually tested the changes and verified that they work.

Summary by CodeRabbit

  • Chores
    • Updated internal build name references for specific image variables to use a generic value. No user-facing changes.

coderabbitai bot (Contributor) commented Jul 31, 2025

Walkthrough

The script ci/check-params-env.sh was updated to change the expected OpenShift build name for two specific image variables from their previous explicit names to a generic value, "konflux". No other logic or function signatures were changed.

Changes

Cohort / File(s): CI Script Build Name Update (ci/check-params-env.sh)
Change Summary: Updated the expected build names for the odh-workbench-rstudio-minimal-cpu-py311-c9s-n and odh-workbench-rstudio-minimal-cuda-py311-c9s-n variables to "konflux". No other logic was changed. An illustrative sketch follows below.
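
For context, the change described above might look roughly like this in ci/check-params-env.sh. This is a minimal sketch: the two variable names, the expected_build_name field, and the "konflux" value come from this PR, while the case-statement structure and the image_variable parameter are assumptions for illustration only.

```bash
# Illustrative sketch only; the real ci/check-params-env.sh may be structured differently.
# The validator maps each params-env image variable to the build name it expects to find.
case "${image_variable}" in
    odh-workbench-rstudio-minimal-cpu-py311-c9s-n)
        # Previously an explicit OpenShift build name; RStudio images now build in Konflux.
        expected_build_name="konflux"
        ;;
    odh-workbench-rstudio-minimal-cuda-py311-c9s-n)
        expected_build_name="konflux"
        ;;
esac
```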

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~2 minutes

Possibly related PRs

Suggested labels

lgtm, size/xs

Suggested reviewers

  • dibryant


📜 Recent review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2778c28 and 7bfd111.

📒 Files selected for processing (1)
  • ci/check-params-env.sh (2 hunks)
🧰 Additional context used
🧠 Learnings (2)
📓 Common learnings
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-12T20:04:31.021Z
Learning: Issue #862 in opendatahub-io/notebooks reveals that the Docker image tag length fix from issue #631 is mathematically insufficient. While the fix correctly limits branch names to 40 characters, this fails for longer target names like "runtime-minimal-ubi9-python-3.11" (33 chars) which still produces tags exceeding the 128-character Docker limit. The fix needs dynamic calculation based on actual target name length rather than a fixed 40-character assumption.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1519
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/runtime-images/llmcompressor-pytorch-ubi9-py311.json:2-9
Timestamp: 2025-07-29T16:00:31.637Z
Learning: jiridanek indicated that the current practice for runtime-images JSON files in opendatahub-io/notebooks has changed significantly from the SHA256 digest pinning pattern, and that rebasing PR #1519 would reveal the new practice which is "something completely different" from the existing pattern.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: atheo89
PR: opendatahub-io/notebooks#1401
File: manifests/overlays/additional/runtime-datascience-imagestream-beta.yaml:8-10
Timestamp: 2025-07-16T10:44:40.023Z
Learning: atheo89 pointed out that the doubled slash issue in runtime-image-url annotations affects multiple imagestream files repository-wide (11 files total), not just individual files. When systematic issues like this are identified during PR review, they should be addressed as separate comprehensive issues rather than individual file fixes, especially when they're outside the scope of the current PR. Issue #1402 was created to track this systematic problem.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/kustomize/base/service.yaml:5-15
Timestamp: 2025-07-02T18:59:15.788Z
Learning: jiridanek creates targeted GitHub issues for specific test quality improvements identified during PR reviews in opendatahub-io/notebooks. Issue #1268 demonstrates this by converting a review comment about insufficient tf2onnx conversion test validation into a comprehensive improvement plan with clear acceptance criteria, code examples, and ROCm-specific context.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-09T09:12:44.088Z
Learning: jiridanek requested GitHub issue creation for GitHub Actions artifact naming conflict during PR #1357 review, specifically for a failing actions/upload-artifact@v4 step with 409 Conflict error. Issue was created with comprehensive problem description covering artifact naming conflicts, root cause analysis of duplicate names in concurrent workflows, four solution options (enhanced naming, overwriting, conditional uploads, matrix-aware naming) with code examples, detailed acceptance criteria, implementation guidance, testing approach, and proper context linking, continuing the established pattern of systematic CI/CD and code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing `runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml` file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-09T11:22:11.653Z
Learning: When analyzing OpenShift CI/Prow job failures in opendatahub-io/notebooks, jobs that appear to "complete successfully" may actually have failed at the image build stage and never executed any e2e tests. Proper analysis requires checking if image builds succeeded before concluding that tests passed or the job was successful.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1497
File: .tekton/odh-pipeline-runtime-minimal-cpu-py312-ubi9-pull-request.yaml:0-0
Timestamp: 2025-07-24T14:05:28.815Z
Learning: jiridanek confirmed that CodeRabbit's original analysis about missing additional-tags parameter in Tekton pipelines was practically correct. While builds can succeed without the parameter declaration due to Tekton's flexibility, having the default parameter declaration becomes valuable when removing explicit argument values, providing better maintainability and cleaner configuration management. Build success doesn't always indicate optimal implementation.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-03T14:04:05.280Z
Learning: In opendatahub-io/notebooks, shell scripting robustness and efficiency issues in extension management loops (such as unquoted globs, word-splitting, and unnecessary repeated copies in run-code-server.sh) are systematically tracked and fixed by quoting variables, using `cp -a --`, and toggling `nullglob`, following the established pattern of operational correctness improvements.
Learnt from: jiridanek
PR: opendatahub-io/notebooks#1154
File: manifests/base/jupyter-tensorflow-notebook-imagestream.yaml:45-45
Timestamp: 2025-06-13T08:34:01.300Z
Learning: When updating dependency versions in `manifests/base/*-imagestream.yaml`, the project convention is to modify only the newest tag (e.g., "2025.1") and intentionally leave earlier tags (e.g., "2024.2") unchanged.
ci/check-params-env.sh (19)

Learnt from: jiridanek
PR: #1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:42-52
Timestamp: 2025-07-09T12:29:56.162Z
Learning: jiridanek requested GitHub issue creation for OpenShift client architecture mapping problem affecting 29 Dockerfiles during PR #1320 review. Issue was created with comprehensive analysis covering all affected files using $(uname -m) returning 'aarch64' but OpenShift mirror expecting 'arm64', systematic solution using BuildKit TARGETARCH mapping with proper amd64→x86_64 and arm64→arm64 conversion, detailed acceptance criteria, and implementation guidance, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-09T11:22:11.653Z
Learning: When analyzing OpenShift CI/Prow job failures in opendatahub-io/notebooks, jobs that appear to "complete successfully" may actually have failed at the image build stage and never executed any e2e tests. Proper analysis requires checking if image builds succeeded before concluding that tests passed or the job was successful.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-11T11:16:05.131Z
Learning: jiridanek requested GitHub issue creation for RStudio py311 Tekton push pipelines during PR #1379 review. Issue #1384 was successfully created covering two RStudio variants (CPU and CUDA) found in manifests/base/params-latest.env, with comprehensive problem description, implementation requirements following the same pattern as other workbench pipelines, clear acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: grdryn
PR: #1320
File: rstudio/rhel9-python-3.11/Dockerfile.cuda:34-35
Timestamp: 2025-07-04T10:41:13.061Z
Learning: In the opendatahub-io/notebooks repository, when adapting NVIDIA CUDA Dockerfiles, the project intentionally maintains consistency with upstream NVIDIA patterns even when it might involve potential risks like empty variable expansions in package installation commands. This is considered acceptable because the containers only run on RHEL 9 with known yum/dnf behavior, and upstream consistency is prioritized over defensive coding practices.

Learnt from: jiridanek
PR: #1519
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/runtime-images/llmcompressor-pytorch-ubi9-py311.json:2-9
Timestamp: 2025-07-29T16:00:31.637Z
Learning: jiridanek indicated that the current practice for runtime-images JSON files in opendatahub-io/notebooks has changed significantly from the SHA256 digest pinning pattern, and that rebasing PR #1519 would reveal the new practice which is "something completely different" from the existing pattern.

Learnt from: jiridanek
PR: #1306
File: jupyter/trustyai/ubi9-python-3.12/kustomize/base/kustomization.yaml:8-12
Timestamp: 2025-07-08T19:09:48.746Z
Learning: jiridanek requested GitHub issue creation for misleading CUDA prefix in TrustyAI image tags during PR #1306 review, affecting both Python 3.11 and 3.12 versions. Issue #1338 was created with comprehensive problem description covering both affected images, repository pattern analysis comparing correct vs incorrect naming conventions, clear solution with code examples, detailed acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: #1269
File: codeserver/ubi9-python-3.12/kustomize/base/pod.yaml:11-11
Timestamp: 2025-07-03T16:17:17.301Z
Learning: jiridanek requested GitHub issue creation for renaming placeholder image reference in codeserver/ubi9-python-3.12/kustomize/base/pod.yaml during PR #1269 review to improve code self-documentation. Issue #1313 was created with comprehensive problem description, multiple implementation options (UPPERCASE_WITH_UNDERSCORES, lowercase-with-dashes, environment variable style), acceptance criteria, and proper context linking, continuing the established pattern of systematic code quality improvements through detailed issue tracking.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-03T14:04:05.280Z
Learning: In opendatahub-io/notebooks, shell scripting robustness and efficiency issues in extension management loops (such as unquoted globs, word-splitting, and unnecessary repeated copies in run-code-server.sh) are systematically tracked and fixed by quoting variables, using cp -a --, and toggling nullglob, following the established pattern of operational correctness improvements.

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-tensorflow-notebook-imagestream.yaml:45-45
Timestamp: 2025-06-13T08:34:01.300Z
Learning: When updating dependency versions in manifests/base/*-imagestream.yaml, the project convention is to modify only the newest tag (e.g., "2025.1") and intentionally leave earlier tags (e.g., "2024.2") unchanged.

Learnt from: jiridanek
PR: #1519
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/runtime-images/llmcompressor-pytorch-ubi9-py311.json:2-9
Timestamp: 2025-07-29T15:56:20.036Z
Learning: Current practice for runtime-images JSON files in opendatahub-io/notebooks: All runtime image configurations use SHA256 digest pinning (format: @sha256:<digest>) instead of mutable tags for immutability, use quay.io/opendatahub/workbench-images registry, include schema_name: "runtime-image" field, and follow consistent structure as seen in existing files like datascience-ubi9-py311.json, pytorch-ubi9-py311.json, etc.

Learnt from: jiridanek
PR: #1154
File: manifests/base/jupyter-pytorch-notebook-imagestream.yaml:0-0
Timestamp: 2025-06-16T11:06:33.139Z
Learning: In the opendatahub-io/notebooks repository, N-1 versions of images in manifest files (like imagestream.yaml files) should not be updated regularly. The versions of packages like codeflare-sdk in N-1 images are frozen to what was released when the image was moved from N to N-1 version. N-1 images are only updated for security vulnerabilities of packages, not for regular version bumps. This is why the version of packages in N-1 images may be quite old compared to the latest N version.

Learnt from: jiridanek
PR: #1519
File: jupyter/pytorch+llmcompressor/ubi9-python-3.11/runtime-images/llmcompressor-pytorch-ubi9-py311.json:2-9
Timestamp: 2025-07-30T08:42:19.393Z
Learning: The new architecture for runtime images in opendatahub-io/notebooks uses a dynamic system instead of static JSON files: (1) Runtime images are defined as ImageStreams in manifests/base/ with opendatahub.io/runtime-image: "true" label and JSON metadata in opendatahub.io/runtime-image-metadata annotation, (2) odh-notebook-controller dynamically creates JSONs in configmaps by listing ImageStreams on cluster, (3) Elyra-enabled workbenches get these JSONs mounted and setup-elyra.sh moves them from /opt/app-root/pipeline-runtimes/..data/*.json to /opt/app-root/share/jupyter/metadata/runtime-images/. Static JSON files in */runtime-images/ directories are no longer used.

Learnt from: jiridanek
PR: #1259
File: jupyter/rocm/tensorflow/ubi9-python-3.12/Pipfile:13-15
Timestamp: 2025-07-03T07:03:45.020Z
Learning: The Python 3.11 infrastructure for ROCm TensorFlow images in opendatahub-io/notebooks is already properly configured in the Makefile with both BASE_DIRS entries for Pipfile lock renewals and all-images targets for CI builds, requiring only commenting out the corresponding Python 3.12 entries when downgrading due to wheel availability issues.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-06-16T11:32:09.203Z
Learning: In the opendatahub-io/notebooks repository, there is a known issue with missing runtimes/rocm/pytorch/ubi9-python-3.11/kustomize/base/kustomization.yaml file that causes rocm runtime tests to fail with "no such file or directory" error. This is tracked in JIRA RHOAIENG-22044 and was intended to be fixed in PR #1015.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-12T20:04:31.021Z
Learning: Issue #862 in opendatahub-io/notebooks reveals that the Docker image tag length fix from issue #631 is mathematically insufficient. While the fix correctly limits branch names to 40 characters, this fails for longer target names like "runtime-minimal-ubi9-python-3.11" (33 chars) which still produces tags exceeding the 128-character Docker limit. The fix needs dynamic calculation based on actual target name length rather than a fixed 40-character assumption.

Learnt from: jiridanek
PR: #1320
File: jupyter/pytorch/ubi9-python-3.12/Dockerfile.cuda:65-66
Timestamp: 2025-07-09T14:03:17.456Z
Learning: In the opendatahub-io/notebooks repository, MSSQL repo files are named with x86_64 in the filename but the repo content itself is architecture-agnostic. The right RPMs get installed for the right architectures automatically, making the issue a file naming problem rather than a repo content problem.

Learnt from: jiridanek
PR: opendatahub-io/notebooks#0
File: :0-0
Timestamp: 2025-07-01T10:41:56.419Z
Learning: In the opendatahub-io/notebooks repository, TensorFlow packages with extras = ["and-cuda"] can cause build conflicts on macOS due to platform-specific CUDA packages. When the Dockerfile installs CUDA system-wide, removing the extras and letting TensorFlow find CUDA at runtime resolves these conflicts.

🔇 Additional comments (2)
ci/check-params-env.sh (2)

242-247: Updated to Konflux – matches new build pipeline

Switching expected_build_name to "konflux" for the CPU RStudio image correctly syncs the validator with the new Konflux-built artifacts.

Please re-check expected_img_size (currently 1377 MB) after the first Konflux push to be sure it still falls within the ±10 % / ±100 MB thresholds.
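
The size check mentioned here could be expressed roughly as below. This is a sketch only, assuming the larger of the two margins (10 % of the baseline or a flat 100 MB) is allowed; the actual logic and variable handling in ci/check-params-env.sh may differ.

```bash
# Sketch of the tolerance rule described in this comment, not the script's actual code.
expected_img_size=1377   # MB, stored baseline for the CPU RStudio image (from this PR)
actual_img_size="$1"     # MB, size of the freshly pushed Konflux image (assumed input)

# Allow whichever margin is larger: 10% of the baseline or a flat 100 MB.
margin=$(( expected_img_size / 10 ))
if (( margin < 100 )); then
    margin=100
fi

if (( actual_img_size < expected_img_size - margin || actual_img_size > expected_img_size + margin )); then
    echo "ERROR: image size ${actual_img_size} MB is outside ${expected_img_size} ± ${margin} MB" >&2
    exit 1
fi
```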


257-262: GPU flavour likewise adjusted – looks good

The CUDA RStudio entry now also expects "konflux", keeping both variants consistent.

As above, confirm that the stored size baseline (6541 MB) remains valid once the fresh image is available.


@openshift-ci bot requested review from jesuino and paulovmr on Jul 31, 2025 09:29
@openshift-ci bot added size/xs and removed size/xs labels on Jul 31, 2025
@jiridanek (Member) commented

/lgtm
it is indeed green

openshift-ci bot (Contributor) commented Aug 1, 2025

[APPROVALNOTIFIER] This PR is APPROVED

Approval requirements bypassed by manually added approval.

This pull-request has been approved by:

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@atheo89 (Member, Author) commented Aug 1, 2025

/override ci/prow/images

openshift-ci bot (Contributor) commented Aug 1, 2025

@atheo89: Overrode contexts on behalf of atheo89: ci/prow/images

In response to this:

/override ci/prow/images

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.

@atheo89 (Member, Author) commented Aug 1, 2025

/override ci/prow/images

openshift-ci bot (Contributor) commented Aug 1, 2025

@atheo89: Overrode contexts on behalf of atheo89: ci/prow/images

In response to this:

/override ci/prow/images

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.

@openshift-merge-bot bot merged commit 5f98b7c into opendatahub-io:main on Aug 1, 2025
14 checks passed